# Multi-domain applicability

- **Macbert4csc V2** (Apache-2.0) · Macropodus · 112 downloads · 2 likes
  A Chinese spelling-correction model that performs well on multiple evaluation datasets and is suitable for text-correction tasks across domains.
  Tags: Large Language Model, PyTorch, Chinese

- **Provence Reranker Debertav3 V1** · naver · 1,506 downloads · 13 likes
  Provence is a lightweight context-pruning model optimized for retrieval-augmented generation, particularly question-answering scenarios.
  Tags: Large Language Model, English

- **Whisper Base Hungarian V1** · sarpba · 26 downloads · 7 likes
  A Hungarian speech-recognition model fine-tuned from OpenAI Whisper-base on 1,200 hours of Hungarian data, outperforming comparable models.
  Tags: Speech Recognition, Transformers, Other

- **Whisper Large V3 Myanmar** (Apache-2.0) · chuuhtetnaing · 172 downloads · 1 like
  An automatic speech-recognition model fine-tuned from openai/whisper-large-v3 on a Burmese speech dataset, designed for Burmese speech transcription.
  Tags: Speech Recognition, Transformers, Other
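Both Whisper fine-tunes above are loaded the same way as any Whisper checkpoint via the `transformers` ASR pipeline. A minimal sketch, assuming the base `openai/whisper-large-v3` id (the exact Hub ids of the fine-tuned variants are not given here and should be checked on their model pages):

```python
def transcribe(audio_path: str, model_id: str = "openai/whisper-large-v3") -> str:
    """Transcribe an audio file with a Whisper checkpoint from the Hugging Face Hub.

    Swap `model_id` for a fine-tuned variant (e.g. the Hungarian or Burmese
    checkpoints listed above) once you have its exact Hub id.
    """
    # Lazy import so this sketch can be loaded without transformers installed.
    from transformers import pipeline

    asr = pipeline("automatic-speech-recognition", model=model_id)
    return asr(audio_path)["text"]
```

The pipeline downloads the checkpoint on first use; for long recordings, `pipeline(..., chunk_length_s=30)` enables chunked decoding.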
- **Tweety 7b Dutch V24a** (Apache-2.0) · Tweeties · 1,568 downloads · 13 likes
  Tweety-7b-dutch is a foundational large language model for Dutch, based on the Mistral architecture and equipped with a Dutch tokenizer for Dutch text processing.
  Tags: Large Language Model, Transformers, Other

- **Translate Ar En V1.0 Hplt Opus** · HPLT · 20 downloads · 2 likes
  An Arabic-to-English machine translation model trained on OPUS and HPLT data, available in both Marian and Hugging Face formats.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Kobart Korean Summarizer V2** · gangyeolkim · 164 downloads · 0 likes
  A Korean text-summarization model trained on gogamza/kobart-base-v2 using 680,000 summary examples from AI Hub.
  Tags: Text Generation, Transformers

- **Hubert Base Audioset** · ALM · 345 downloads · 2 likes
  An audio-representation model based on the HuBERT architecture, pre-trained on the full AudioSet dataset and suitable for general audio tasks.
  Tags: Audio Classification, Transformers

- **Llama 2 7b Absa** (Apache-2.0) · Orkhan · 124 downloads · 12 likes
  An aspect-based sentiment analysis (ABSA) model fine-tuned from Llama-2-7b that identifies aspects in text and analyzes their sentiment.
  Tags: Large Language Model, Transformers, Supports Multiple Languages

- **Dfm Sentence Encoder Large** (MIT) · KennethEnevoldsen · 66 downloads · 2 likes
  A Danish text-embedding model trained with the SimCSE method, developed as part of the Scandinavian Embedding Benchmark.
  Tags: Text Embedding, Transformers, Other
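Text-embedding models like the Danish encoder above are conventionally used through the `sentence-transformers` library. A minimal sketch; the Hub id below is a hypothetical placeholder inferred from the listing and should be verified on the model page:

```python
def embed(sentences, model_name="KennethEnevoldsen/dfm-sentence-encoder-large"):
    """Encode a list of sentences into dense vectors.

    `model_name` is an assumed Hub id -- check the actual model page.
    """
    # Lazy import: sentence-transformers (and torch) load only when embedding.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer(model_name)
    return model.encode(sentences)  # numpy array, one row per sentence
```

Cosine similarity between the returned rows then serves retrieval or clustering, which is the typical downstream use of a SimCSE-trained encoder.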
- **Zero Shot Vanilla Bi Encoder** (MIT) · claritylab · 27 downloads · 0 likes
  A BERT-based bi-encoder designed for zero-shot text classification, trained on the UTCD dataset.
  Tags: Text Classification, Transformers, English

- **T5 Efficient Gc4 All German Small El32** (MIT) · GermanT5 · 52 downloads · 4 likes
  A T5 model trained on the large-scale cleaned German Common Crawl corpus (GC4), specializing in German natural language processing tasks.
  Tags: Large Language Model, Transformers, German

- **Bloom 350m German** (MIT) · malteos · 26 downloads · 0 likes
  A BLOOM-350m language model trained from scratch on German data; a small-scale member of the BLOOM series focused on German text generation.
  Tags: Large Language Model, Transformers, German

- **Tibetan Roberta Causal Base** (MIT) · sangjeedondrub · 156 downloads · 5 likes
  A Tibetan causal language model pre-trained on the RoBERTa architecture, primarily intended for Tibetan text generation.
  Tags: Large Language Model, Transformers, Other

- **Opus Mt Tc Big Tr En** · Helsinki-NLP · 98.62k downloads · 29 likes
  A large Transformer-based neural machine translation model for Turkish-to-English translation.
  Tags: Machine Translation, Transformers, Supports Multiple Languages
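All of the OPUS-MT checkpoints in this listing share one usage pattern: the `transformers` translation pipeline with the Helsinki-NLP Hub id. A minimal sketch using the Turkish-to-English model above:

```python
def translate(text: str, model_id: str = "Helsinki-NLP/opus-mt-tc-big-tr-en") -> str:
    """Translate `text` with an OPUS-MT checkpoint from the Hugging Face Hub."""
    # Lazy import keeps the sketch importable without transformers installed.
    from transformers import pipeline

    translator = pipeline("translation", model=model_id)
    return translator(text)[0]["translation_text"]
```

Swapping `model_id` for any other entry's id (e.g. `Helsinki-NLP/opus-mt-en-fr`) covers the rest of the language pairs listed below.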
- **Opus Mt Tc Big It En** · Helsinki-NLP · 175 downloads · 2 likes
  A neural machine translation model for Italian-to-English translation, part of the OPUS-MT project, using the transformer-big architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Distilbert Word2vec 256k MLM 250k** · vocab-transformers · 21 downloads · 0 likes
  Combines a word2vec embedding layer, trained on large-scale corpora and kept frozen, with the DistilBERT architecture fine-tuned via masked language modeling; suitable for general NLP tasks.
  Tags: Large Language Model, Transformers

- **Ner English Ontonotes** · flair · 175.71k downloads · 19 likes
  Flair's built-in English 18-class named-entity-recognition model, trained on the OntoNotes dataset, reaching an F1 score of 89.27.
  Tags: Sequence Labeling, English
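The Flair NER model above is loaded through Flair's own API rather than a `transformers` pipeline. A minimal sketch, assuming the `flair/ner-english-ontonotes` Hub id implied by the listing:

```python
def tag_entities(text: str):
    """Return (span_text, entity_label) pairs for the 18 OntoNotes entity classes."""
    # Lazy imports: flair pulls in torch, so defer until actually needed.
    from flair.data import Sentence
    from flair.models import SequenceTagger

    tagger = SequenceTagger.load("flair/ner-english-ontonotes")  # downloads on first use
    sentence = Sentence(text)
    tagger.predict(sentence)
    return [(span.text, span.tag) for span in sentence.get_spans("ner")]
```

Each returned label is one of the OntoNotes classes (PERSON, ORG, GPE, DATE, and so on).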
- **Opus Mt Es De** (Apache-2.0) · Helsinki-NLP · 1,719 downloads · 0 likes
  A Transformer-based Spanish-to-German machine translation model developed by the University of Helsinki NLP team.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt En Af** (Apache-2.0) · Helsinki-NLP · 460 downloads · 0 likes
  An English-to-Afrikaans machine translation model trained on the OPUS dataset, using the transformer-align architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Fr Ro** (Apache-2.0) · Helsinki-NLP · 157 downloads · 0 likes
  A French-to-Romanian neural machine translation model based on the Transformer architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Fi Fr** (Apache-2.0) · Helsinki-NLP · 86 downloads · 0 likes
  A Transformer-based Finnish-to-French machine translation model trained on the OPUS dataset.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Es Nl** (Apache-2.0) · Helsinki-NLP · 73 downloads · 0 likes
  A Spanish-to-Dutch OPUS-MT model based on the Transformer architecture, using SentencePiece for preprocessing.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Fi Lue** (Apache-2.0) · Helsinki-NLP · 13 downloads · 0 likes
  A Transformer-based Finnish-to-Lue machine translation model trained on the OPUS dataset.
  Tags: Machine Translation, Transformers, Other

- **Opus Mt Fi Cs** (Apache-2.0) · Helsinki-NLP · 28 downloads · 0 likes
  A Finnish-to-Czech machine translation model based on the Transformer architecture.
  Tags: Machine Translation, Transformers, Other

- **Opus Mt Fr Hr** (Apache-2.0) · Helsinki-NLP · 21 downloads · 0 likes
  A French-to-Croatian machine translation model based on the Transformer architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Sq Sv** (Apache-2.0) · Helsinki-NLP · 15 downloads · 0 likes
  A Transformer-based Albanian (sq) to Swedish (sv) machine translation model.
  Tags: Machine Translation, Transformers, Other

- **Opus Mt Fr Ru** (Apache-2.0) · Helsinki-NLP · 2,057 downloads · 0 likes
  A French-to-Russian neural machine translation model based on the Transformer architecture, trained on the OPUS dataset.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Ss En** (Apache-2.0) · Helsinki-NLP · 35 downloads · 0 likes
  A Swati (ss) to English OPUS-MT model based on the Transformer architecture with SentencePiece preprocessing.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Ha En** (Apache-2.0) · Helsinki-NLP · 200 downloads · 1 like
  A Hausa-to-English machine translation model based on the Transformer architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Id En** (Apache-2.0) · Helsinki-NLP · 29.53k downloads · 15 likes
  A Transformer-based Indonesian-to-English machine translation model trained on the OPUS dataset.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt De He** (Apache-2.0) · Helsinki-NLP · 17 downloads · 0 likes
  A German-to-Hebrew OPUS-MT model based on the transformer-align architecture, using SentencePiece for preprocessing.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Sl Es** (Apache-2.0) · Helsinki-NLP · 15 downloads · 0 likes
  A Transformer-based machine translation model for Slovenian (sl) to Spanish (es).
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt En Hu** (Apache-2.0) · Helsinki-NLP · 1,949 downloads · 2 likes
  An English-to-Hungarian neural machine translation model trained on OPUS data, using the transformer-align architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt En Fr** (Apache-2.0) · Helsinki-NLP · 325.86k downloads · 57 likes
  An English-to-French machine translation model based on the transformer-align architecture.
  Tags: Machine Translation, Supports Multiple Languages

- **Opus Mt De Pl** (Apache-2.0) · Helsinki-NLP · 1,103 downloads · 0 likes
  A German-to-Polish machine translation model trained on OPUS data, using the transformer-align architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt It De** (Apache-2.0) · Helsinki-NLP · 1,070 downloads · 0 likes
  An Italian-to-German machine translation model based on the transformer-align architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt En Mk** (Apache-2.0) · Helsinki-NLP · 205 downloads · 0 likes
  An English-to-Macedonian OPUS-MT model based on the Transformer architecture, using SentencePiece for tokenization.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt Sk Es** (Apache-2.0) · Helsinki-NLP · 190 downloads · 0 likes
  A Transformer-based Slovak-to-Spanish machine translation model trained on the OPUS dataset.
  Tags: Machine Translation, Transformers, Supports Multiple Languages

- **Opus Mt De Gaa** (Apache-2.0) · Helsinki-NLP · 15 downloads · 0 likes
  A German-to-Ga machine translation model from the OPUS-MT project, implemented on the Transformer architecture.
  Tags: Machine Translation, Transformers, Supports Multiple Languages
© 2025 AIbase